Importance Sampled Learning Ensembles

Authors

  • Jerome H. Friedman
  • Bogdan E. Popescu
Abstract

Learning a function of many arguments is viewed from the perspective of high-dimensional numerical quadrature. It is shown that many of the popular ensemble learning procedures can be cast in this framework. In particular, randomized methods, including bagging and random forests, are seen to correspond to random Monte Carlo integration methods, each based on a particular importance sampling strategy. Non-random boosting methods are seen to correspond to deterministic quasi-Monte Carlo integration techniques. This view helps explain some of their properties and suggests modifications to them that can substantially improve their accuracy while dramatically improving computational performance.
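As a rough illustration of the two-stage ensemble view sketched in the abstract, and not the authors' actual procedure, the Python sketch below builds a library of base learners on randomly perturbed subsamples (a crude stand-in for an importance sampling strategy) and then combines them with an L1-regularized post-fit. The helper names make_isle_ensemble and predict_isle, and the use of scikit-learn's DecisionTreeRegressor and Lasso, are illustrative assumptions.

```python
# Illustrative sketch only -- not the procedure from the paper.
# Assumes scikit-learn and NumPy are available.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Lasso

def make_isle_ensemble(X, y, n_learners=200, subsample=0.5,
                        max_depth=3, alpha=0.01, random_state=0):
    """Two-stage ensemble: (1) sample a library of shallow trees on
    random subsamples (a crude stand-in for importance sampling of the
    base-learner space), (2) fit their predictions with an L1 post-fit."""
    rng = np.random.RandomState(random_state)
    n = X.shape[0]
    learners = []
    for _ in range(n_learners):
        idx = rng.choice(n, size=int(subsample * n), replace=False)
        tree = DecisionTreeRegressor(max_depth=max_depth,
                                     random_state=rng.randint(2**31 - 1))
        tree.fit(X[idx], y[idx])
        learners.append(tree)
    # Predictions of every base learner become the "basis functions".
    F = np.column_stack([t.predict(X) for t in learners])
    post = Lasso(alpha=alpha).fit(F, y)   # sparse combination weights
    return learners, post

def predict_isle(learners, post, X):
    F = np.column_stack([t.predict(X) for t in learners])
    return post.predict(F)
```

The Lasso post-fit is what makes the combination sparse; a plain average of the trees would correspond to giving every sampled learner an equal quadrature weight.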


Similar articles

ISCLEs: Importance Sampled Circuit Learning Ensembles for Trustworthy Analog Circuit Topology Synthesis

Importance Sampled Circuit Learning Ensembles (ISCLEs) is a novel analog circuit topology synthesis method that returns designer-trustworthy circuits yet can apply to a broad range of circuit design problems, including novel functionality. ISCLEs uses the machine learning technique of boosting, which performs importance sampling of “weak learners” to create an overall circuit ensemble. In ISCLEs, the...


Soft Rule Ensembles for Statistical Learning

In this article, supervised learning problems are solved using soft rule ensembles. We first review the importance sampled learning ensembles (ISLE) approach, which is useful for generating hard rules. The soft rules are obtained with logistic regression using the corresponding hard rules. Soft rules are useful when both the response and the input variables are continuous because the soft rules p...
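One plausible reading of the hard-to-soft rule step described above, assuming scikit-learn, is sketched below: hard rules are taken as 0/1 leaf-membership indicators of a shallow tree, and each is replaced by the fitted probabilities of a logistic regression trained to predict that membership from the inputs. The data, variable names, and depth are made up for illustration and are not from the cited article.

```python
# Hedged sketch of turning hard rules into soft rules with logistic
# regression; assumes scikit-learn and NumPy.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.randn(500)

# Hard rules: 0/1 leaf-membership indicators of a shallow tree.
tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
leaf_ids = tree.apply(X)
hard_rules = np.column_stack([(leaf_ids == leaf).astype(int)
                              for leaf in np.unique(leaf_ids)])

# Soft rules: for each hard rule, a logistic regression predicts rule
# membership from the inputs; predict_proba gives a smooth [0, 1]
# membership instead of a sharp 0/1 indicator.
soft_rules = []
for j in range(hard_rules.shape[1]):
    clf = LogisticRegression().fit(X, hard_rules[:, j])
    soft_rules.append(clf.predict_proba(X)[:, 1])
soft_rules = np.column_stack(soft_rules)
```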


Actively Exploring Creation of Face Space(s) for Improved Face Recognition

We propose a learning framework that actively explores creation of face space(s) by selecting images that are complementary to the images already represented in the face space. We also construct ensembles of classifiers learned from such actively sampled image sets, which further improves the recognition rates. We not only significantly reduce the number of images required in the...


Learning Sparse Structured Ensembles with SG-MCMC and Network Pruning

An ensemble of neural networks is known to be more robust and accurate than an individual network, but usually at a linearly increased cost in both training and testing. In this work, we propose a two-stage method to learn Sparse Structured Ensembles (SSEs) for neural networks. In the first stage, we run SG-MCMC with group sparse priors to draw an ensemble of samples from the posterior dist...
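A toy sketch of the two-stage idea described above, under loose assumptions: stochastic gradient Langevin dynamics (a simple SG-MCMC scheme) draws an ensemble of weight samples for a linear model under a group-sparsity-inducing prior, and whole groups whose weights stay small across the samples are then pruned. Pure NumPy; the model, grouping, and threshold are illustrative and not those of the cited paper, which works with neural networks.

```python
# Illustrative only: SGLD samples for a linear model with a group-sparse
# prior, followed by a crude group-pruning step. Assumes NumPy.
import numpy as np

rng = np.random.RandomState(0)
n, d = 1000, 20
X = rng.randn(n, d)
true_w = np.zeros(d)
true_w[:4] = [2.0, -1.5, 1.0, 0.5]          # only the first group matters
y = X @ true_w + 0.1 * rng.randn(n)

groups = np.array_split(np.arange(d), 5)     # illustrative weight groups

def grad_log_post(w, xb, yb, scale, lam=1.0):
    """Minibatch gradient of the log posterior: Gaussian likelihood
    (rescaled to the full data set) plus a group-lasso-style prior."""
    grad_lik = scale * (xb.T @ (yb - xb @ w))
    grad_prior = np.zeros_like(w)
    for g in groups:
        grad_prior[g] = -lam * w[g] / (np.linalg.norm(w[g]) + 1e-12)
    return grad_lik + grad_prior

step, batch, samples = 1e-4, 100, []
w = np.zeros(d)
for t in range(4000):
    idx = rng.choice(n, batch, replace=False)
    g = grad_log_post(w, X[idx], y[idx], scale=n / batch)
    w = w + 0.5 * step * g + np.sqrt(step) * rng.randn(d)   # SGLD update
    if t >= 2000 and t % 200 == 0:
        samples.append(w.copy())                            # ensemble member

# Prune whole groups whose average norm across the sampled ensemble is small.
W = np.array(samples)
keep = np.zeros(d, dtype=bool)
for g in groups:
    keep[g] = np.mean(np.linalg.norm(W[:, g], axis=1)) > 0.5
pruned_ensemble = [wi * keep for wi in samples]
```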


Monte Carlo Sampling in Path Space: Calculating Time Correlation Functions by Transforming Ensembles of Trajectories

Computational studies of processes in complex systems with metastable states are often complicated by a wide separation of time scales. Such processes can be studied with transition path sampling, a computational methodology based on an importance sampling of reactive trajectories capable of bridging this time scale gap. Within this perspective, ensembles of trajectories are sampled and manipul...




Publication year: 2003